


Search for: All records

Creators/Authors contains: "Liu, Jonathan"


  1. Dynamic Programming (DP) is commonly regarded as one of the most difficult topics in the upper-level algorithms curriculum. The teaching of metacognitive strategies may prove effective in helping students learn to design DP algorithms. To explore both whether students learn and use these strategies on their own and the effect of guidance about using them, we conducted think-aloud interviews with structured guidance at two points in a college algorithms course: once immediately after students learned the concept and once at the end of the course. We explore 1) what metacognitive strategies are commonly employed by students, 2) how effectively they help students solve problems, and 3) to what extent structured guidance about using metacognitive strategies is effective. We find that these strategies generally help students make progress in solving DP problems, but that they can mislead students as well. We also find that the adoption of these strategies is an individualized process and that structured strategy guidance is often insufficient to enable students to solve individual DP problems, indicating the need for more extensive strategy instruction. 
  2. Though the increased availability of Large Language Models (LLMs) presents significant potential for change in the way students learn to program, the text-based nature of the available tools currently precludes block-based languages from much of that innovation. In an attempt to remedy this, we identify the strengths and weaknesses of using a transpiler to leverage the existing learning in commercially available LLMs and Scratch, a visual block-based programming language. Using only prompt engineering, we evaluate an LLM's performance on two common classroom tasks in a Scratch curriculum. We evaluate the LLM's ability to: 1) Create project solutions that compile and satisfy project requirements and 2) Analyze student projects' completion of project requirements using natural language. In both cases, we find results indicating that prompt engineering alone is insufficient to reliably produce high-quality results. For projects of medium complexity, the LLM-generated solutions consistently failed to follow correct syntax or, in the few instances with correct syntax, to produce correct solutions. When used for autograding, we found a correlation between scores assigned by the official Scratch Encore autograder and those generated by the LLM; nevertheless, the discrepancies between the 'real' scores and the scores assigned by the LLM remained too great for the tool to be reliable in a classroom setting. 
  3. According to an ecological affordances perspective, any static curriculum has a set of affordances, and differences in teachers, students, and the teaching environment change how those affordances are viewed and used. Therefore, teaching is a relationship between the curriculum, the teacher, and the students. As such, it is not only possible but expected that a teacher will diverge from the details of a lesson plan to better accommodate their own needs as a teacher and their students' needs as learners. In this study, we report on a mixed-methods investigation that explores the different ways upper-elementary and middle-school teachers (of students 7-13 years old) implement the Scratch-based TIPP&SEE learning strategy and the reasoning for their approaches. As expected, we find that teachers across grade levels often deviate from lesson plan details to cater to their own classrooms. For example, teachers serving younger grades were far more likely to keep scaffolds that lesson plans suggest removing. The varied degree of deviation suggests that the repeated use of a learning strategy, alongside lesson plans that present a variety of scaffolded implementations, is beneficial in enabling teachers to adapt lesson content to serve the needs of their specific classroom. 
  4. Quantum computing presents a paradigmatic shift in the field of computation, in which unintuitive properties of quantum mechanics can be harnessed to change the way we approach a wide range of problems. However, due to the mathematics and physics perspective through which quantum computing is traditionally presented, most resources are inaccessible to many undergraduate students, let alone the general public. It is thus imperative to develop resources and best-practices for quantum computing instruction accessible to students at all levels. In this paper, we describe the development and results of our Massive Open Online Course (MOOC) "Introduction to Quantum Computing for Everyone." This course presents an introduction to quantum computing with few technical prerequisites. In the first half of the course, quantum computing concepts are introduced with a unique, purely visual representation, allowing students to develop conceptual understanding without the burden of learning new mathematical notation. In the second half, students are taught the formal notation for concepts and objects already introduced, reinforcing student understanding of these concepts and providing an applicable context for the technical material. Most notably, we find that introducing the math content in the curriculum's second stage led to no drops in engagement or student performance, suggesting that our curriculum's spiral structure eased the technical burden. 
  5. In recent years, there has been a revived appreciation for the importance of spatial context and morphological phenotypes for both understanding disease progression and guiding treatment decisions. Compared with conventional 2D histopathology, which is the current gold standard of medical diagnostics, nondestructive 3D pathology offers researchers and clinicians the ability to visualize orders of magnitude more tissue within their natural volumetric context. This has been enabled by rapid advances in tissue-preparation methods, high-throughput 3D microscopy instrumentation, and computational tools for processing these massive feature-rich data sets. Here, we provide a brief overview of many of these technical advances along with remaining challenges to be overcome. We also speculate on the future of 3D pathology as applied in translational investigations, preclinical drug development, and clinical decision-support assays. 
  6. In recent years, technological advances in tissue preparation, high‐throughput volumetric microscopy, and computational infrastructure have enabled rapid developments in nondestructive 3D pathology, in which high‐resolution histologic datasets are obtained from thick tissue specimens, such as whole biopsies, without the need for physical sectioning onto glass slides. While 3D pathology generates massive datasets that are attractive for automated computational analysis, there is also a desire to use 3D pathology to improve the visual assessment of tissue histology. In this perspective, we discuss and provide examples of potential advantages of 3D pathology for the visual assessment of clinical specimens and the challenges of dealing with large 3D datasets (of individual or multiple specimens) that pathologists have not been trained to interpret. We discuss the need for artificial intelligence triaging algorithms and explainable analysis methods to assist pathologists or other domain experts in the interpretation of these novel, often complex, large datasets. 
  7. Abstract Prostate cancer treatment decisions rely heavily on subjective visual interpretation [assigning Gleason patterns or International Society of Urological Pathology (ISUP) grade groups] of limited numbers of two‐dimensional (2D) histology sections. Under this paradigm, interobserver variance is high, with ISUP grades not correlating well with outcome for individual patients, and this contributes to the over‐ and undertreatment of patients. Recent studies have demonstrated improved prognostication of prostate cancer outcomes based on computational analyses of glands and nuclei within 2D whole slide images. Our group has also shown that the computational analysis of three‐dimensional (3D) glandular features, extracted from 3D pathology datasets of whole intact biopsies, can allow for improved recurrence prediction compared to corresponding 2D features. Here we seek to expand on these prior studies by exploring the prognostic value of 3D shape‐based nuclear features in prostate cancer (e.g. nuclear size, sphericity). 3D pathology datasets were generated using open‐top light‐sheet (OTLS) microscopy of 102 cancer‐containing biopsies extracted ex vivo from the prostatectomy specimens of 46 patients. A deep learning‐based workflow was developed for 3D nuclear segmentation within the glandular epithelium versus stromal regions of the biopsies. 3D shape‐based nuclear features were extracted, and a nested cross‐validation scheme was used to train a supervised machine classifier based on 5‐year biochemical recurrence (BCR) outcomes. Nuclear features of the glandular epithelium were found to be more prognostic than stromal cell nuclear features (area under the ROC curve [AUC] = 0.72 versus 0.63). 3D shape‐based nuclear features of the glandular epithelium were also more strongly associated with the risk of BCR than analogous 2D features (AUC = 0.72 versus 0.62). 
The results of this preliminary investigation suggest that 3D shape‐based nuclear features are associated with prostate cancer aggressiveness and could be of value for the development of decision‐support tools. © 2023 The Pathological Society of Great Britain and Ireland. 
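For readers unfamiliar with the nested cross-validation scheme mentioned above, the sketch below illustrates the general pattern (inner folds tune hyperparameters, outer folds estimate AUC). This is not the authors' pipeline: the classifier, feature count, and all data here are placeholders, and random features will yield AUC near chance rather than the reported 0.72.

```python
# Minimal sketch of nested cross-validation for a BCR classifier.
# All data, the choice of logistic regression, and the C grid are
# illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Placeholder matrix: rows = 102 biopsies, columns = 3D shape-based
# nuclear features (e.g. mean nuclear volume, sphericity).
X = rng.normal(size=(102, 8))
# Placeholder labels: 5-year biochemical recurrence (1) or not (0).
y = rng.integers(0, 2, size=102)

inner = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)  # tunes C
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)  # estimates AUC

model = GridSearchCV(
    make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    param_grid={"logisticregression__C": [0.01, 0.1, 1.0, 10.0]},
    cv=inner,
    scoring="roc_auc",
)
# The outer loop never sees the data used to pick C, so the mean AUC
# is an (approximately) unbiased estimate of generalization.
auc_per_fold = cross_val_score(model, X, y, cv=outer, scoring="roc_auc")
print(f"mean outer-fold AUC: {auc_per_fold.mean():.2f}")
```

The key point of the nesting is that hyperparameter selection happens entirely inside each outer training split, avoiding the optimistic bias of tuning on the same folds used for evaluation.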
  8. Prostate cancer prognostication largely relies on visual assessment of a few thinly sectioned biopsy specimens under a microscope to assign a Gleason grade group (GG). Unfortunately, the assigned GG is not always associated with a patient's outcome, in part because of the limited sampling of spatially heterogeneous tumors achieved by 2-dimensional histopathology. In this study, open-top light-sheet microscopy was used to obtain 3-dimensional pathology data sets that were assessed by 4 human readers. Intrabiopsy variability was assessed by asking readers to perform Gleason grading of 5 different levels per biopsy for a total of 20 core needle biopsies (i.e., 100 total images). Intrabiopsy variability (Cohen κ) was calculated as the worst pairwise agreement in GG between individual levels within each biopsy and found to be 0.34, 0.34, 0.38, and 0.43 for the 4 pathologists. These preliminary results reveal that even within a 1-mm-diameter needle core, GG based on 2-dimensional images can vary dramatically depending on the location within a biopsy being analyzed. We believe that morphologic assessment of whole biopsies in 3 dimensions has the potential to enable more reliable and consistent tumor grading. 
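As a rough illustration of the "worst pairwise agreement" metric described above, the sketch below computes Cohen's κ between each pair of sectioning levels and keeps the minimum. The grade-group values are invented, and treating each level as a "rater" scored across all 20 biopsies is an assumed interpretation of the abstract, not the authors' exact computation.

```python
# Hedged sketch: worst pairwise Cohen's kappa between biopsy levels.
# Grade data are fabricated placeholders; the levels-as-raters framing
# is an assumption about how the metric was computed.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

# Hypothetical grade groups (1-5) assigned by one reader:
# grades[level][biopsy], 5 levels x 20 biopsies.
grades = [
    [1, 2, 3, 1, 2] * 4,
    [1, 2, 2, 1, 3] * 4,
    [2, 2, 3, 1, 2] * 4,
    [1, 3, 3, 2, 2] * 4,
    [1, 2, 3, 1, 1] * 4,
]

# Cohen's kappa for every pair of levels; the reported statistic is
# the worst (minimum) agreement over all pairs.
worst = min(
    cohen_kappa_score(grades[i], grades[j])
    for i, j in combinations(range(len(grades)), 2)
)
print(f"worst pairwise kappa: {worst:.2f}")
```

Cohen's κ corrects raw agreement for the agreement expected by chance, so a low minimum over level pairs indicates that at least two depths within the same core would yield substantially different grades.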